Search results for "Slice sampling"

Showing 6 of 6 documents

Group Metropolis Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference and optimization in statistics, signal processing and machine learning. Two well-known classes of MC methods are the Importance Sampling (IS) techniques and the Markov Chain Monte Carlo (MCMC) algorithms. In this work, we introduce the Group Importance Sampling (GIS) framework where different sets of weighted samples are properly summarized with one summary particle and one summary weight. GIS facilitates the design of novel efficient MC techniques. For instance, we present the Group Metropolis Sampling (GMS) algorithm which produces a Markov chain of sets of weighted samples. GMS in general outperforms other multiple try schemes…
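The summary-particle idea described in the abstract can be sketched in a minimal, illustrative Python version. Everything concrete below (the bimodal target, the Gaussian proposal, the group size, centring the next group at the current summary particle) is an assumption for illustration, not the paper's actual setup: each group of weighted IS samples is compressed into one resampled particle plus its average weight, and a new group is accepted with probability given by the ratio of summary weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    """Unnormalized target: a two-mode Gaussian mixture (illustrative)."""
    return 0.5 * np.exp(-0.5 * (x - 2.0) ** 2) + 0.5 * np.exp(-0.5 * (x + 2.0) ** 2)

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

def summarize(samples, weights, rng):
    """GIS-style summary: one particle resampled from the group,
    plus the average importance weight of the group."""
    particle = rng.choice(samples, p=weights / weights.sum())
    return particle, weights.mean()

def group_metropolis(n_iters=2000, group_size=10, scale=3.0):
    """Markov chain over groups of weighted samples: a candidate group
    is accepted with probability min(1, W_new / W), the ratio of the
    two summary weights (a hypothetical acceptance rule for this sketch)."""
    x = rng.normal(0.0, scale, group_size)
    w = target(x) / gauss_pdf(x, 0.0, scale)
    s, W = summarize(x, w, rng)
    chain = np.empty(n_iters)
    for t in range(n_iters):
        x_new = rng.normal(s, scale, group_size)   # proposal centred at summary
        w_new = target(x_new) / gauss_pdf(x_new, s, scale)
        s_new, W_new = summarize(x_new, w_new, rng)
        if rng.uniform() < min(1.0, W_new / W):
            s, W = s_new, W_new
        chain[t] = s
    return chain
```

Since the toy target is symmetric about zero, the chain of summary particles should average close to zero if both modes are visited.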

Computer science; Monte Carlo method; Markov process; Slice sampling; Probability density function; Multiple-try Metropolis; Bayesian inference; Machine learning; Hybrid Monte Carlo; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; Markov chain; Rejection sampling; Sampling (statistics); Markov chain Monte Carlo; Metropolis–Hastings algorithm; Monte Carlo method in statistical physics; Monte Carlo integration; Artificial intelligence; Particle filter; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing; Algorithm; Importance sampling; Monte Carlo molecular modeling

Recycling Gibbs sampling

2017

Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning and statistics. The key point for the successful application of the Gibbs sampler is the ability to draw samples from the full-conditional probability density functions efficiently. In the general case this is not possible, so in order to speed up the convergence of the chain, it is required to generate auxiliary samples. However, such intermediate information is finally disregarded. In this work, we show that these auxiliary samples can be recycled within the Gibbs estimators, improving their efficiency with no extra cost. Theoretical and exhaustive numerical co…
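The recycling idea in the abstract — keep the auxiliary samples generated while simulating each full conditional, instead of discarding all but the last — can be sketched as follows. The bivariate Gaussian target, the MH-within-Gibbs inner sampler, and the number of inner steps are all assumptions chosen so the example is self-contained; this is not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(1)
RHO = 0.8  # correlation of the bivariate standard-Gaussian target

def mh_conditional(x0, mean, t_steps, rng):
    """MH steps targeting the full conditional N(mean, 1 - RHO^2).
    Returns ALL visited states so they can be recycled."""
    var = 1.0 - RHO ** 2
    states, x = [], x0
    for _ in range(t_steps):
        prop = x + rng.normal()
        log_alpha = ((x - mean) ** 2 - (prop - mean) ** 2) / (2.0 * var)
        if np.log(rng.uniform()) < log_alpha:
            x = prop
        states.append(x)
    return states

def recycling_gibbs(n_sweeps=2000, t_steps=5):
    """Gibbs sweeps over (x1, x2); 'recycled' keeps every auxiliary state,
    'last_only' keeps only the final state of each conditional update."""
    x1 = x2 = 0.0
    recycled, last_only = [], []
    for _ in range(n_sweeps):
        aux1 = mh_conditional(x1, RHO * x2, t_steps, rng)
        x1 = aux1[-1]
        aux2 = mh_conditional(x2, RHO * x1, t_steps, rng)
        x2 = aux2[-1]
        recycled.extend(aux1 + aux2)  # auxiliary states enter the estimator
        last_only.append(x1)
        last_only.append(x2)
    return np.array(recycled), np.array(last_only)
```

Both sample sets estimate the same marginal mean (zero here), but the recycled estimator averages t_steps times as many points for the same simulation cost.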

Computer science; Monte Carlo method; Slice sampling; Markov process; Probability density function; Machine learning; Hybrid Monte Carlo; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; Rejection sampling; Estimator; Markov chain Monte Carlo; Artificial intelligence; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing; Algorithm; Gibbs sampling; 2017 25th European Signal Processing Conference (EUSIPCO)

The Recycling Gibbs sampler for efficient learning

2018

Monte Carlo methods are essential tools for Bayesian inference. Gibbs sampling is a well-known Markov chain Monte Carlo (MCMC) algorithm, extensively used in signal processing, machine learning, and statistics, employed to draw samples from complicated high-dimensional posterior distributions. The key point for the successful application of the Gibbs sampler is the ability to efficiently draw samples from the full-conditional probability density functions. Since in the general case this is not possible, in order to speed up the convergence of the chain, it is required to generate auxiliary samples whose information is eventually disregarded. In this work, we show that these auxiliary sample…

FOS: Computer and information sciences; Monte Carlo method; Slice sampling; Inference; Machine Learning (stat.ML); Bayesian inference; Statistics - Computation; Machine Learning (cs.LG); [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; Statistics - Machine Learning; Artificial Intelligence; Statistics; Electrical and Electronic Engineering; Gaussian process; Computation (stat.CO); Mathematics; Chain rule (probability); Applied Mathematics; Markov chain Monte Carlo; Computer Science - Learning; Computational Theory and Mathematics; Signal Processing; Computer Vision and Pattern Recognition; Statistics, Probability and Uncertainty; Algorithm; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing; Gibbs sampling; Digital Signal Processing

Avoiding Boundary Effects in Wang-Landau Sampling

2003

A simple modification of the "Wang-Landau sampling" algorithm removes the systematic error that occurs at the boundary of the range of energy over which the random walk takes place in the original algorithm.
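For readers unfamiliar with the underlying algorithm, a minimal Wang-Landau sketch is given below on a toy problem where the density of states is known exactly: the "energy" is the number of ones in a bitstring, so g(E) = C(n, E). The toy system, sweep length, flatness criterion, and modification factor schedule are all assumptions for illustration; note that the sketch updates g at the current energy after every step, including rejected moves — the kind of bookkeeping at the edges of the energy range that the boundary-effect discussion concerns.

```python
import numpy as np
from math import comb, log

rng = np.random.default_rng(2)

def wang_landau(n_bits=10, flatness=0.8, log_f_final=1e-4, sweep=5000):
    """Wang-Landau estimate of log g(E), where E = number of ones in an
    n_bits string; the exact answer is log C(n_bits, E)."""
    state = rng.integers(0, 2, n_bits)
    E = int(state.sum())
    log_g = np.zeros(n_bits + 1)   # running estimate of log g(E)
    hist = np.zeros(n_bits + 1)    # visit histogram for the flatness check
    log_f = 1.0                    # modification factor, halved when flat
    while log_f > log_f_final:
        for _ in range(sweep):
            i = rng.integers(n_bits)
            E_new = E + (1 - 2 * int(state[i]))  # a flip changes E by +/-1
            # accept with probability min(1, g(E) / g(E_new))
            if np.log(rng.uniform()) < log_g[E] - log_g[E_new]:
                state[i] ^= 1
                E = E_new
            # update at the CURRENT energy even after a rejection
            log_g[E] += log_f
            hist[E] += 1
        if hist.min() > flatness * hist.mean():  # histogram flat enough
            log_f *= 0.5
            hist[:] = 0.0
    return log_g - log_g[0]  # normalize so g(0) = 1
```

With n_bits = 10, the returned estimate should track log C(10, E) up to the usual Wang-Landau saturation error.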

Heterogeneous random walk in one dimension; Statistical Mechanics (cond-mat.stat-mech); Rejection sampling; FOS: Physical sciences; Slice sampling; Sampling (statistics); Boundary (topology); Random walk; Combinatorics; Range (statistics); Applied mathematics; Energy (signal processing); Condensed Matter - Statistical Mechanics; Mathematics

Anti-tempered Layered Adaptive Importance Sampling

2017

Monte Carlo (MC) methods are widely used for Bayesian inference in signal processing, machine learning and statistics. In this work, we introduce an adaptive importance sampler which mixes together the benefits of the Importance Sampling (IS) and Markov Chain Monte Carlo (MCMC) approaches. Different parallel MCMC chains provide the location parameters of the proposal probability density functions (pdfs) used in an IS method. The MCMC algorithms consider a tempered version of the posterior distribution as invariant density. We also provide exhaustive theoretical support explaining why, in the presented technique, even an anti-tempering strategy (reducing the scaling of the posterior) can …
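The two-layer structure described above can be sketched as follows, under illustrative assumptions (a standard-Gaussian posterior, a random-walk MH upper layer, Gaussian IS proposals, and a deterministic-mixture weight denominator — none of these are claimed to match the paper's exact construction): parallel MCMC chains on the tempered posterior supply the locations of the IS proposals, and the IS weights are computed against the mixture of all proposals.

```python
import numpy as np

rng = np.random.default_rng(3)

def log_post(x):
    """Unnormalized log-posterior: standard Gaussian (illustrative)."""
    return -0.5 * x ** 2

def lais(n_chains=4, n_iters=500, beta=2.0, sigma=1.0):
    """Layered AIS sketch. Upper layer: parallel MH chains whose invariant
    density is the tempered posterior pi^beta (beta > 1 corresponds to
    anti-tempering, a narrower invariant density). Lower layer: one IS draw
    per chain, weighted with the mixture of all proposal pdfs."""
    mus = rng.normal(0.0, 3.0, n_chains)   # chain states = proposal means
    samples, weights = [], []
    for _ in range(n_iters):
        for c in range(n_chains):          # upper (MCMC) layer
            prop = mus[c] + rng.normal()
            if np.log(rng.uniform()) < beta * (log_post(prop) - log_post(mus[c])):
                mus[c] = prop
        for c in range(n_chains):          # lower (IS) layer
            z = mus[c] + sigma * rng.normal()
            # deterministic mixture of all proposal pdfs in the denominator
            q = np.mean(np.exp(-0.5 * ((z - mus) / sigma) ** 2)) \
                / (sigma * np.sqrt(2.0 * np.pi))
            samples.append(z)
            weights.append(np.exp(log_post(z)) / q)
    s, w = np.array(samples), np.array(weights)
    return np.sum(w * s) / np.sum(w)       # self-normalized posterior mean
```

For this symmetric toy posterior the self-normalized estimate of the mean should be close to zero; the mixture denominator is what keeps the weights stable even when individual chains wander.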

Mathematical optimization; Rejection sampling; Slice sampling; Markov chain Monte Carlo; Hybrid Monte Carlo; Metropolis–Hastings algorithm; [INFO.INFO-TS] Computer Science [cs]/Signal and Image Processing; Parallel tempering; Particle filter; [SPI.SIGNAL] Engineering Sciences [physics]/Signal and Image processing; Importance sampling; Mathematics

Monte-Carlo Methods

2003

The article contains sections titled:

1 Introduction and Overview
2 Random-Number Generation
2.1 General Introduction
2.2 Properties That a Random-Number Generator (RNG) Should Have
2.3 Comments about a Few Frequently Used Generators
3 Simple Sampling of Probability Distributions Using Random Numbers
3.1 Numerical Estimation of Known Probability Distributions
3.2 "Importance Sampling" versus "Simple Sampling"
3.3 Monte-Carlo as a Method of Integration
3.4 Infinite Integration Space
3.5 Random Selection of Lattice Sites
3.6 The Self-Avoiding Walk Problem
3.7 Simple Sampling versus Biased Sampling: the Example of SAWs Continued
4 Survey of Applications to Simulation of Transport Processes
4.…
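The contrast drawn in sections 3.2-3.3 (importance sampling versus simple sampling for Monte-Carlo integration) can be illustrated with a short self-contained example. The specific integrand and proposal density below are assumptions chosen so the exact answer is known: simple sampling draws uniformly, while importance sampling draws from a density concentrated where the integrand is large, reducing the variance of the estimate.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Target integral: I = integral_0^1 exp(-10 x) dx = (1 - e^(-10)) / 10
f = lambda x: np.exp(-10.0 * x)
true_I = (1.0 - np.exp(-10.0)) / 10.0

# Simple sampling: x ~ Uniform(0, 1); most draws land where f is tiny.
x_u = rng.uniform(0.0, 1.0, n)
vals_simple = f(x_u)

# Importance sampling: x ~ q(x) = 5 e^(-5x) / (1 - e^(-5)) on [0, 1],
# drawn by inverting the CDF, so draws concentrate near x = 0.
u = rng.uniform(0.0, 1.0, n)
x_q = -np.log(1.0 - u * (1.0 - np.exp(-5.0))) / 5.0
q = 5.0 * np.exp(-5.0 * x_q) / (1.0 - np.exp(-5.0))
vals_is = f(x_q) / q  # importance-weighted integrand values

est_simple = vals_simple.mean()
est_is = vals_is.mean()
```

Both estimators are unbiased for I, but the spread of the importance-weighted values is several times smaller than that of the uniform-sampling values, so the importance-sampling estimate converges faster for the same number of draws.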

Rejection sampling; Monte Carlo method; Slice sampling; Sampling (statistics); Monte Carlo method in statistical physics; Statistical physics; Statistical mechanics; Umbrella sampling; Importance sampling; Mathematics